Posts Tagged ‘4K’

HDMI 2.1 Update – Pretty Much Status Quo

Last Thursday, the HDMI Licensing Administrator held a joint press conference in New York City to update attendees on the latest version of HDMI – version 2.1.

V2.1, which was officially announced at CES in 2017, represents a quantum leap over earlier versions. It’s the first HDMI architecture to use a packet-based signaling structure, unlike earlier versions that employed transition-minimized differential signaling (TMDS). By moving to a packet transport (an architecture that, according to my sources, borrows a lot from DisplayPort), the maximum data rate could be expanded several-fold from the previous cap of 18 gigabits per second (Gb/s) to a stratospheric 48 Gb/s.

What’s more, the clock reference now travels embedded with the data, freeing up what used to be a dedicated clock lane: HDMI versions up to 2.0 were limited to three signal lanes plus one clock lane, while V2.1 runs four full-speed data lanes. And of course, a digital packet-based signal stream lends itself well to compression, accomplished with VESA’s Display Stream Compression (DSC) system that is also the basis for AptoVision’s BlueRiver NT technology.
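
For a quick sanity check on those lane numbers, here’s a back-of-the-envelope sketch in Python, using the per-lane rates from the published specs (three data lanes at 6 Gb/s for HDMI 2.0 TMDS; four lanes at 12 Gb/s for V2.1’s Fixed Rate Link signaling):

```python
# Per-lane rates from the published HDMI specs
tmds_lanes, tmds_gbps_per_lane = 3, 6.0   # HDMI 2.0: three data lanes plus a dedicated clock lane
frl_lanes, frl_gbps_per_lane = 4, 12.0    # HDMI 2.1 FRL: clock embedded, so all four lanes carry data

print(tmds_lanes * tmds_gbps_per_lane)    # 18.0 Gb/s - the old cap
print(frl_lanes * frl_gbps_per_lane)      # 48.0 Gb/s - the new ceiling
```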

The HDMI Forum simply had to kick up the performance of the interface. Version 2.0, announced five years ago, was perceived by many (including me) to be too slow right out of the gate, especially when compared to DisplayPort 1.2 (18 Gb/s vs. 21.6 Gb/s). That criticism proved prescient: Just half a decade later, Ultra HDTVs are rapidly approaching the unit shipment numbers of Full HD models, and high dynamic range (HDR) imaging with wide color gamuts (WCG) needs much faster highways, especially with RGB (4:4:4) color encoding and 10-bit and 12-bit color rendering.

And if we needed any more proof that a faster interface was overdue, along comes 8K. Samsung is already shipping an 8K TV in the U.S. as of this writing, and Sharp has introduced a model in Japan. LG’s bringing out an 8K OLED TV in early 2019, and Dell has a 32-inch 8K LCD monitor for your desktop.

To drive this point home, IHS analyst Paul Gagnon showed forecasts calling for 430,000 shipments of 8K TVs in 2019, growing to 1.9 million in 2020 and 5.4 million in 2022. 70% of those shipments are expected to go to China, with North America making up 15% and Western Europe 7%. Presumably, at least one of the signal inputs on these TVs will support HDMI 2.1, as even a basic 8K video signal (60p, 10-bit 4:2:0) will require a data rate of about 36 Gb/s, while a 4:2:2 version demands 48 Gb/s – right at the red line. (DSC would cut both of those rates in half.)
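
If you want to check that math yourself, here’s a rough Python sketch. The 20% blanking allowance is my own simplifying assumption (the exact CTA timings differ a bit), so treat the outputs as ballpark figures:

```python
def video_rate_gbps(h, v, fps, bits, samples_per_pixel, blanking=1.2):
    """Rough signal rate: active pixels x bit depth x frame rate, padded
    ~20% for blanking. samples_per_pixel: 3 for RGB/4:4:4, 2 for 4:2:2,
    1.5 for 4:2:0."""
    return h * v * fps * bits * samples_per_pixel * blanking / 1e9

print(video_rate_gbps(7680, 4320, 60, 10, 1.5))  # ~35.8 Gb/s: 8K/60, 10-bit 4:2:0
print(video_rate_gbps(7680, 4320, 60, 10, 2.0))  # ~47.8 Gb/s: 4:2:2 - right at the red line
```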

Aside from stating that over 900 million HDMI-equipped devices are expected to ship in 2019 (including everything from medical cameras to karaoke machines), HDMI Licensing CEO Rob Tobias didn’t offer much in the way of real news. But I had a few deeper questions, the first of which was “Is there now native support for optical interfaces in the HDMI 2.1 standard?” (Answer – no, not yet.)

My next question was about manufacturers of V2.1 transmitter/receiver chipsets. Had any been announced that could actually support 48 Gb/s? According to Tobias, HDMI Forum member Socionext, a chip manufacturer in Japan, has begun production on said chipsets. I followed that reply up with a question about manufacturer support for DSC in televisions and other CE devices, but couldn’t get a specific answer.

Much of the discussion among the panelists – David Meyer, director of technical content for CEDIA; Brad Bramy, VP of marketing for HDMI LA; and Scott Kleinle, director of product management for Legrand (a supplier to the CEDIA industry) – focused on future-proofing residential installations that use HDMI interconnects.

But why not just go optical for all HDMI 2.1 connections and guarantee future-proofing? The responses to my last question were mostly along the lines of “The installer just wants it to work the first time.” Yes, there are faster (Ultra High Speed) HDMI cables available now to work with V2.1 connections. But a copper HDMI cable that has to run 20, 30, or 40 feet at clock rates above 1 GHz is a pretty fat cable!

Multimode fiber cable is inexpensive compared to Cat 6 cable, and the terminations are not difficult to install. Running strands of fiber through conduit and stone and behind walls seems to be the most logical solution at the required speeds, and it’s certainly what I’d recommend to installers in the commercial AV market. Properly terminated, optical fiber works the first time and every time, and it can run over a mile without significant signal degradation.

Once again, the HDMI Forum will have a booth at CES in the lower South Hall. With a new display wrinkle lurking in the shadows – high frame rate (HFR) video – there will be more upward pressure than ever on data rates for display connections. HDMI 2.1 may be up to the task (most likely aided by DSC), so I will be curious to see if there are any 8K/120 demos in Las Vegas. – PP

NAB 2018 In The Rear View Mirror

I just returned from my annual visit to the NAB Show in Las Vegas and the overall impression was of an industry (or industries) marching in place. Many booths were smaller; there were plenty of empty spaces filled with tables and chairs for eating and lounging, and at times you could hear crickets chirping in the North and Central Halls.  (Not so the South Hall, which was a madhouse all three days I visited.)

There are a number of possible reasons for this lack of energy. The broadcast and film industries are taking their first steps toward IP backbones for everything from production to post and distribution, and that transition is moving slowly. Even so, there was no shortage of vendors trying to convince booth visitors that AV-over-IT is the way to go, chop-chop!

Some NAB exhibitors that were formerly powerhouses in traditional media production infrastructures have staked their entire business model on IT, with flashy exhibits featuring powerful codecs, cloud media storage and retrieval, high dynamic range (HDR) imaging, and production workflows (editing, color correction, and visual effects) all interconnected via an IT infrastructure.

And, of course, there is now a SMPTE standard for transporting professional media over managed IP networks (note the word “managed”), and that’s ST 2110. The pertinent documents (to date) are SMPTE ST 2110-10/-20/-30, which cover system timing plus uncompressed video and audio streams, and SMPTE ST 2110-21, which specifies traffic shaping and delivery timing of uncompressed video.

No doubt about it – the Central Hall booths were definitely smaller and quieter this year.

 

Canon’s Larry Thorpe and Ivo Norenberg talked about the company’s new 50-1000mm zoom lens for Full HD cameras.

 

Blackmagic Design’s Pocket Cinema Camera 4K is quite popular – and affordable.

Others at NAB weren’t so sure about this rush to IT and extolled the virtues of next-generation SDI (6G, 12G, and even 24G). Their argument is that deterministic video doesn’t always travel well alongside the non-real-time traffic you find on networks. And the “pro” SDI crowd may have a point, based on all of the 12G connectivity demos we saw. 3G-SDI, to be specific, runs at about 2.97 Gb/s, so a 12G connection is good for 11.88 Gb/s – fast enough to transport an uncompressed 4K/60 video signal with 8-bit 4:2:2 color or 10-bit 4:2:0 color.
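
Here’s that arithmetic as a quick Python sketch (again, the ~20% blanking allowance is my own rough assumption):

```python
# Nominal payload rates for the SDI family - each generation quadruples 3G's 2.97 Gb/s
sdi_gbps = {"3G": 2.97, "6G": 5.94, "12G": 11.88, "24G": 23.76}

# Uncompressed 4K/60 with 10-bit 4:2:0 color, padded ~20% for blanking
rate_4k60 = 3840 * 2160 * 60 * 10 * 1.5 * 1.2 / 1e9
print(rate_4k60)        # ~9.0 Gb/s - fits on a single 12G-SDI link
print(rate_4k60 * 4)    # ~35.8 Gb/s - an 8K version needs four 12G links
```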

I’ve talked about 8K video and displays in previous columns, but mostly from a science experiment perspective. Well, we were quite surprised – perhaps pleasantly – to see Sharp exhibiting at NAB, showing an entire acquisition, editing, production, storage, and display system for 8K video. (Yes, that Sharp, the same guys that make those huge LCD displays, now owned by Hon Hai Precision Industry.)

Sharp’s 8K broadcast camera, the 8C-B60A, uses a single Super 35mm sensor with an effective resolution of 7680×4320 pixels arrayed in a Bayer format. That’s 16 times the resolution of a Full HD camera, which means data rates 16 times that of 3G-SDI. In case you are math challenged, we’re talking in the range of 48 Gb/s of data for a 4320p/60 video signal with 8-bit 4:2:2 color – which requires four 12G connections.

Sharp is building 8K cameras for live coverage of the 2020 Tokyo Olympics.

 

NHK demonstrated an 8K 240Hz slow motion video playback system, along with other 8K goodies.

 

Soliton demonstrated H.265 encoding across multiple platforms, including Android devices.

And this isn’t a science experiment at all. Sharp is building cameras for the live 8K broadcasts to take place at the 2020 Tokyo Olympics, originating from Japanese broadcast network NHK. By now, this should be old hat, as NHK has been covering the Olympics in 8K since 2012 and showed different approaches to home viewing in Las Vegas. They also impressed with demos of 8K “slo-mo” video at a frame rate of 240 Hz, and yes, it is practical and ready to roll.

In the NHK booth, you could also watch a demonstration of 8K/60 video traveling through a 10 Gb/s switch using so-called mezzanine compression based on the TiCo codec. In this case, NHK was using 5:1 TiCo compression to pare a 40 Gb/s 8K/60 video stream down to 8 Gb/s. (Four 12G video connections would add up to nearly 48 Gb/s, in case you’re wondering.)
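
The arithmetic is simple enough to spell out (a sketch using NHK’s quoted figures):

```python
raw_8k60_gbps = 40.0                 # the 8K/60 stream rate NHK quoted
tico_ratio = 5                       # 5:1 "mezzanine" (light, low-latency) compression
print(raw_8k60_gbps / tico_ratio)    # 8.0 Gb/s - comfortably inside a 10 Gb/s switch port
```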

Not far from NHK’s booth last year was a virtual city of companies showing virtual reality (VR) and augmented reality (AR) hardware and software. That was about twice the size of the VR/AR exhibits in 2016, so I expected to find a sprawling metropolis of VR goodies. Instead, I came across a very large food court and lots of partitioned-off space. Turns out, what was left of the VR companies occupied a small pavilion known as “Immersive Storytelling.” Is VR the next 3D? (Probably not, but you couldn’t be blamed for thinking that.)

Panasonic’s got a 55-inch 4K OLED monitor for client viewing.

 

Epson showed an ultra short-throw laser projection system with excellent edge-to-edge sharpness.

 

The gadgeteers at NTT built a drone with a spinning LED sign shaped like a globe. Why? Because they could, I suppose.

Upstairs in the South Hall, there were dozens of companies hawking video compression tools, streaming and cloud services, targeted ad insertion, audience analytics, and a bunch of other buzzwords I’m probably getting too old to completely understand. (It will be interesting to see how many of these enterprises are still around a year from now.)

But my primary goal in that hall was to talk to folks from the Alliance for Open Media coalition. In case you haven’t heard of this group, they’ve been promoting an open-source, royalty-free codec called AV1 for “next-generation 4K video.” There are at least 18 prominent members of the group, and you may recognize a few of them, such as Google, Apple, Mozilla, YouTube, Netflix, Facebook, and VideoLAN.

What they’re promoting is a codec that is very similar to HEVC H.265, which is made up of lots of intellectual property that requires licensing from an organization known as MPEG-LA (Licensing Authority, not Los Angeles). The AOM contingent thinks it is taking WAY too long to get H.265 off the ground and would rather just make a suitable codec free to anyone who wants to use it, speeding up the transition to 4K video.

In addition to giving out red, yellow, green, and blue lollipops, Google had its Jump 360-degree camera out for inspection.

 

Technicolor claims to have solved the problem of rapid switching between different HDR formats streaming in the same program.

 

Keep an eye on the AV1 codec. It could really upset the apple cart.

Of course, they didn’t have a ready answer when I questioned the future viability of any company that had sunk millions of dollars into H.265 development, only to see their hard work given away for free. The stock answers included “there will be winners and losers” and “some companies will probably be bought out.” Note that the primary goal of the members I listed is content delivery, not living off patent royalties, so that gives you some insight into their thinking.

The last puzzle piece was the new ATSC 3.0 standard for digital TV broadcasting, which is being tried out in several markets as I write this – most notably Phoenix. ATSC 3.0 is not compatible with the current version 1.0, as it uses a different modulation system (OFDM vs. 8VSB) and is very much intertwined with IP to make delivery to mobile devices practical. WRAL in Raleigh, North Carolina, has been broadcasting in the format for almost a year now.

ATSC 3.0 is already being tested in several TV markets. Will it take off? And how will consumers choose to watch it?

 

CreateLED had this cool LED “waterfall” in their booth.

ATSC 3.0 is designed to be more bandwidth-efficient and can carry 1080p and 4K broadcasts along with high dynamic range video. At the show, I saw demos of ATSC 3.0 receivers married to 802.11ac WiFi routers, ATSC 3.0 set-top boxes, and even an autonomous shuttle vehicle between the Central and South Halls that was supposedly carrying live ATSC 3.0 mobile broadcasts. (It wasn’t working at the time, though. More crickets…)

All in all, a very subdued show, but one reflective of an industry in transition from a world of deterministic video traveling uncompressed over coaxial cable to compressed audio and video packets streaming through wired and wireless networks with varying degrees of latency. Where do we go from here?

 

 

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values is a real challenge for displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic range of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (about 100 candelas per square meter, or cd/m²), which represented close to 100% diffuse white. In an HDR display, that value largely holds for diffuse white, but intense specular highlights (like the sun reflecting off a pane of glass or water, or a bright streetlight at night) can hit peaks in the thousands of cd/m².

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.
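
To put those luminance and contrast numbers in one place, here’s a small Python sketch (the foot-Lambert conversion is the standard 3.426 cd/m² per fL; everything else follows from each f-stop doubling luminance):

```python
FL_TO_NITS = 3.426                    # 1 foot-Lambert ~= 3.426 cd/m^2
print(round(29.2 * FL_TO_NITS))       # ~100 nits: the old CRT's diffuse white

def contrast_from_stops(stops):
    """Each f-stop doubles luminance, so n stops span a 2^n : 1 tonal range."""
    return 2 ** stops

print(contrast_from_stops(11))        # 2,048:1 - conventional capture (9-11 stops)
print(contrast_from_stops(20))        # 1,048,576:1 - the ~1,000,000:1 figure quoted above
```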

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m² brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color channel, resulting in 1,073,741,824 colors – over 1 billion shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
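
The color math is easy to verify:

```python
def total_colors(bits_per_channel):
    """An RGB pixel has three channels, so the palette is 2^(3 x bits)."""
    return 2 ** (3 * bits_per_channel)

print(f"{total_colors(8):,}")    # 16,777,216 - the familiar '16.7 million colors'
print(f"{total_colors(10):,}")   # 1,073,741,824 - just over a billion shades
```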

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough: if the 4K signal uses reduced color resolution (4:2:0 or 4:2:2), HDMI 2.0 can transport HDR at frame rates up to 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.
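
A quick way to see why: take the standard CTA-861 timing for 3840×2160p/60 (4400×2250 total pixels, a 594 MHz pixel clock) and remember that TMDS puts 10 bits on the wire for every 8 bits of data. A rough sketch, not a spec calculator:

```python
def tmds_gbps(h_total, v_total, fps, bits_per_channel):
    """On-the-wire TMDS rate for RGB: pixel clock x 3 channels x 10 bits
    (8b/10b coding), scaled up proportionally for deep color."""
    pixel_clock = h_total * v_total * fps              # includes blanking
    return pixel_clock * 3 * 10 * (bits_per_channel / 8) / 1e9

print(tmds_gbps(4400, 2250, 60, 8))    # ~17.8 Gb/s - just squeaks under HDMI 2.0's 18
print(tmds_gbps(4400, 2250, 60, 10))   # ~22.3 Gb/s - over the limit, so no 10-bit RGB at 60 Hz
```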

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840×2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises the maximum speed to 32.4 Gb/s, which makes 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI handles HDR and WCG through extensions, with ‘a’ covering static HDR metadata and ‘b’ dynamic metadata.)

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands the HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging – medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.

InfoComm 2016 In The Rearview Mirror

Another InfoComm show has come and gone. This was my 23rd InfoComm, and when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – it was hard to imagine that I’d still be making the trek to Orlando and Las Vegas all these years later, let alone teaching classes and joining the InfoComm faculty.

For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.

First off, I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.

And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)

AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth.  Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.

And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.

Kramer’s huge booth at InfoComm, touting a shift away from “big boxes” to software and the cloud, was one of the exceptions to the trend to go smaller.

 

LG is doing some very cool things with curved displays, thanks to advancements in OLED and LCD manufacturing.

Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.

Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.

Surprisingly, Toshiba proved there is indeed a second act, showing up with a nice-size booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having big footprints. But they’re giving it a shot.

Toshiba has re-entered the super-competitive world of display walls…a market it once dominated 20 years ago.

 

The “surfer dude engineers” from Santa Barbara have a very nice 4K-over-IP encoder/decoder line-up!

Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.

If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.

Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.

Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.

Epson finally got religion and showed its first laser/phosphor 3LCD projector this year – a 25,000-lumen model.

 

And Panasonic harnessed laser/phosphor technology to a new high-brightness 4K projector.

How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.

Here’s another trend – LED walls. I tried to count all of the exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every single exhibitor was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.

I also saw LED walls with pixel pitches as small as 0.9mm. That’s smaller than the pixel pitch of a 50-inch 1366×768 plasma monitor from 1995! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100-foot-plus 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.

And Sony’s CLEDIS 8K×2K LED wall shows just how far we’ve come with this technology, creating what appeared to be a perfectly seamless, pixel-free panoramic image that dazzled with bright, super-saturated colors.

Sony’s CLEDIS 8K x 2K LED wall did an excellent job of hiding its seams – and pixels.

 

Planar (Leyard) is building some amazingly big and bright display walls. And they’ve got 8K resolution, too, thanks to using 16 2K panels.

The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).

In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.

My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards, toward software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously vigorous defenders of HDMI-based architectures.

The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.

You know there’s considerable interest in AV-over-IP when these guys show up.

 

RGB Spectrum’s new Zio AV-over-IP system has one of the most user-friendly interfaces I’ve seen to date – touch and swipe to connect video streams.

What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they’d prefer to skip past HDMI-based solutions and move directly to an IP-type solution.

Hand-in-hand with this discovery came more comments about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies in Vegas showing app-based AV control products with super-simple GUIs – software that just about anyone could learn to use in a few hours.

Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…

AV-over-IP: It’s Here. Time To Get On Board!

At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.

Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.

This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.

You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude since the first “slow as a snail” dial-up connections got us onto AOL two decades ago. Now, it’s not unusual to have sustained connections of 10, 15, 25, and even 50 megabits per second (Mb/s) to the home – fast enough to stream Ultra HD content with multichannel sound.

And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.

So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)

I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.

The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, that industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that was never suitable for commercial AV applications).

We’ve even figured out a way to digitize the HDMI TMDS signal and extend it using category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.

So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?

What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.

Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.

So why do we keep playing tricks with specifications and working with Band-Aid solutions? We shouldn’t. We don’t need to. And the answer is already at hand.

It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.

Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed with low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the display is connected.
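
As a back-of-the-envelope example of that bandwidth management (the 5:1 mezzanine ratio here is purely illustrative, not any particular vendor’s figure):

```python
# How many lightly compressed 4K/60 streams fit on one 10 GbE trunk?
uncompressed = 3840 * 2160 * 60 * 10 * 1.5 / 1e9   # ~7.5 Gb/s: 10-bit 4:2:0, active pixels only
per_stream = uncompressed / 5                       # ~1.5 Gb/s after 5:1 mezzanine encoding
usable = 10.0 * 0.95                                # leave 5% headroom for control traffic
print(int(usable // per_stream))                    # 6 streams per 10 Gb/s link
```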

Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.

And if we’re smart and not afraid to learn something new, we’ll wire it all up with optical fiber instead of bulky copper cables and the proprietary transmitter/receiver pairs that convert signals to a packet format and back. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)

For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…

To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.

AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.

Are you on board, or what?