Posts Tagged ‘Ultra HD’

Blu-Ray: On The Endangered Species List?

One of the problems with market research is that you often wind up with conflicting data from two or more sources. Or, the data presents a “conclusion” that’s all too easy to “spin” to advance an argument or make a point.

Ever since the two adversaries in the blue-laser optical disc format war squared off with pistols at twenty paces in 2008 (and one lost), the clear trend of media consumption has favored streaming and digital downloads. Entire business models have collapsed as a result, including the Hollywood Video and Blockbuster Video sales and rental stores. The last two Blockbuster outlets in Alaska are closing, leaving just one solitary brick-and-mortar operation in Oregon.

With Netflix now serving over 100 million subscribers around the world and Amazon rumored to be working on a smart TV for delivering Prime video, the tide hasn’t stopped rising. Purchases of digital downloads and streaming media surpassed physical media in dollar value way back in 2015 and the gap continues to widen as more customers take advantage of fast broadband, smarter DVRs, and improved codecs for reliable delivery of Full HD AND 4K video over networks.

My industry colleague Greg Tarr recently posted a story on the HD GURU Web site quoting NPD Group analyst Stephen Baker as saying that, “…Ultra HD Blu-ray player sales increased by more than 150% over 2017 and the revenue is up 61%. The [average selling price] ASP is $165 this year compared to $272 for the first 5 months of 2017.” Baker further pointed out that sales of Ultra HD Blu-ray players in the United States increased 82% in May and revenue increased 13% with an ASP of $168. NPD estimates that 4K Ultra HD players represented about 15% of Blu-ray unit sales for the first five months of 2018.

Well, that certainly sounds like great news, doesn’t it? But some perspective is in order.

First off, all of these $168 players (which cost north of $300 to $500 not long ago) also have built-in Wi-Fi connections and can stream content from the likes of Netflix, Amazon, YouTube, and Hulu. And of course, they’re backward-compatible with standard Blu-ray, DVD, and CD audio formats.

Given the ridiculously low prices on Ultra HDTVs these days (such as 55-inch models with HDR10 support for as low as $450), many consumers may simply be in a major TV and home entertainment upgrade cycle. I bought my first 1080p TV in 2008, a 42-inch Panasonic plasma, for about $1200. And I’m now ready to upgrade from a 2012-vintage, 47-inch 1080p LCD model to a 55-inch or 60-inch smart 4K set with HDR support, which will cost me about as much as that 42-inch Panasonic did in 2008.

Will I pick up an Ultra HD player too? Hey, for $150, why not? And will I watch a lot of UHD Blu-ray discs on it? Probably not, since I will be able to stream Netflix and Prime video at 4K resolution. Will that streamed 4K content look as good as a physical disc playing out at more than 100 Mb/s? Maybe not, but on the other hand, I won’t have to buy or rent any more discs. And based on my experience the other night watching “The Catcher Was A Spy” from Amazon Prime, I will be quite happy with the result.

Yes, you can buy a 4K TV at Shop Rite, available in the bread aisle. (Photo courtesy Norm Hurst)

As the saying goes, facts are stubborn things. The facts are: physical media sales have been in slow and steady decline for over a decade (and continue to decline), and Ultra HD BD disc sales constitute a small portion of overall media consumption. For that matter, so do sales of players: Research firm Futuresource predicts that global UHD Blu-ray player unit shipments should hit just 2.3 million, with more than 50% of those sales taking place in North America.

To put that in perspective, ABI Research forecasts that worldwide Ultra HD flat panel TV shipments will surpass 102 million in 2018, representing 44% of all WW flat panel TV shipments (about 232 million). So even with “record” sales growth, Ultra HD Blu-ray player sales will only constitute about 2.2% of Ultra HDTV sales, with the bulk of those player sales taking place in North America and Europe.

ABI also predicts that just shy of 200 million Ultra HDTVs will be sold in 2023 worldwide, with the majority taking place in China (which doesn’t use our Blu-ray format but instead relies on “China Blue,” the old HD-DVD standard). Coincidentally, Tarr’s article states that, “…market research predicts that blue laser optical disc player shipments will decrease from 72.1 million in 2017 to 68 million in 2023. Unit shipments for the global Blu-ray media market are expected to decrease from 595 million in 2017 to 516 million in 2023.”

That trend would seem to be at odds with TV purchases, according to an April press release from Futuresource. “We believe 4K UHD TV sets will ship over 100 million units this year, equivalent to two-thirds of the entire large screen market,” comments David Tett, Market Analyst at Futuresource Consulting. “Consumers increasingly want larger screens, and this is playing nicely into the 4K UHD proposition. HDR is expected to be present in 60% of 4K UHD sets this year.”

Digesting all of this data reveals that (a) 4K TV sales continue to grow worldwide (which is also being driven by a changeover from Full HD to 4K TV fab production, but that’s another story), (b) 4K TV sales will constitute an ever-larger percentage of overall TV sales by 2023 – if not close to 90%, (c) more and more consumers are streaming and downloading digital video rather than purchasing optical discs, (d) even with strong sales through the first six months of 2018, Ultra HD Blu-ray players are selling at a rate of just two for every 100 Ultra HDTVs purchased, and (e) overall sales of Blu-ray players of all kinds are in steady decline.

I fully expect to hear all of the arguments for UHD Blu-ray, picture quality being one of them. But if I can stream UHD content with HDR at acceptable quality levels, why do I need to buy discs? I’ll have access to an enormous cloud library and I’ll be more environmentally conscious, too. Besides, I rarely watch a movie more than once (look at the piles of old DVDs people try to get rid of at garage sales or foist on libraries). There’s plenty of good content available from video-on-demand.

Ultra HD video content with HDR @ 16 Mb/s that looks as good as UHD Blu-ray? Yep, Fraunhofer IHS showed it at NAB 2016.

And UHD BD supporters neglect to consider all of the continual advancements being made with codecs. A couple of years ago, Fraunhofer showed absolutely stunning Ultra HD video with dynamic HDR on a 65-inch UHDTV, encoded with HEVC H.265 at an average bit rate of 16 Mb/s – 15% of the peak streaming rate for Ultra HD Blu-ray – and they were encoding tricky stuff like confetti, wind-whipped waves, and moving objects with plenty of changing specular highlights. All heavy lifting.

Granted, it took two computers to do the software encoding and decoding. But those two computers can easily be reduced to a set of chips with firmware and a powerful CPU and installed inside my next TV.

So what would I need an optical disc player for?

Heads Up! Here Comes 8K TV (or, The Case Of The Amazing Vanishing Pixels)

Yes, you read that right: 8K displays are coming. For that matter, 8K broadcasting has already been underway in Japan since 2012, and several companies are developing 8K video cameras to be shown at next month’s NAB show in Las Vegas.

“Hold on a minute!” you’re probably thinking. “I don’t even own a 4K TV yet. And now they’re already on the endangered species list?”

Well, not exactly. But two recent press releases show just how crazy the world of display technology has become.

The first release came from Insight Media in February and stated that, “The 2020 Tokyo Olympics will be a major driver in the development of 8K infrastructure with Japanese broadcaster NHK leading efforts to produce and broadcast Olympic programming to homes…cameras from Hitachi, Astrodesign, Ikegami, Sharp and Sony address the many challenges in capturing 8K video…the display industry plans for massive expansion of Gen 10.5 capacity, which will enable efficient production of 65″ and 75″ display panels for both LCD and OLED TV…. sales of 8K Flat Panel TVs are expected to increase from 0.1 million in 2018 to 5.8 million in 2022, with China leading the way representing more than 60% of the total market during this period.”

Read it again. Almost 6 million 8K LCD and OLED TVs are expected to be sold four years from now, and over 3 million of those sales will be in China.

But there’s more. Analyst firm IHS Markit issued their own forecasts for 8K TV earlier this month, predicting that, “While ultra-high definition (UHD) panels are estimated to account for more than 98 percent of the 60-inch and larger display market in 2017, most TV panel suppliers are planning to mass produce 8K displays in 2018. The 7680 x 4320-pixel resolution display is expected to make up about 1 percent of the 60-inch and larger display market this year and 9 percent in 2020.”

According to IHS Markit, companies with skin in the 8K game include Innolux, which will supply 65-inch LCD panels to Sharp for use in consumer televisions and in commercial AV displays. Meanwhile, Sharp – which had previously shown an 85-inch 8K TV prototype – will ramp up production of a new 70-inch 8K LCD display (LV-70X500E) in their Sakai Gen 10 LCD plant. This display was shown in Sharp’s booth at ISE, along with their new 8K video camera.

Sharp showed this 8K camera (BC-B60A) at ISE…

 

…feeding this 70-inch 8K LCD monitor (LV-70X500E), a new glass cut from the Sakai Gen 10 fab.

Sony and Samsung are also expected to launch 8K LCD TVs this year. Both companies showed prototypes at CES with Samsung’s offering measuring about 85 inches. Sony’s prototype also measured 85 inches but included micro light-emitting diodes (LEDs) in the backlight to achieve what Sony described as “full high dynamic range,” achieving peak (specular) brightness of 10,000 nits. (That’ll give you a pretty good sunburn!)

Other players in 8K include LG Display, which already announced an 88-inch 8K OLED TV prior to CES, and Chinese fabricators BOE, AUO, and China Electronics Corporation (CEC). What’s even more interesting is that some of these 8K LCD and OLED panels will be equipped with indium gallium zinc oxide (IGZO) switching transistors.

No, IGZO isn’t a cure for aging. But what it does is provide much higher pixel density in a given screen size with lower power consumption. More importantly, it will allow these 8K TVs to refresh their pictures as fast as 120 Hz – double the normal refresh rate we use today. And that will be important as High Frame Rate (HFR) video production ramps up.

LG Display’s 88-inch 8K OLED display was a real eye-catcher at CES 2018.

Predictably, prices for TVs and monitors using panels with 4K resolution are collapsing. In the AV channel, 4K (Ultra HD) displays are only beginning to show up in product lines, but manufacturers are well aware of pricing trends with Ultra HD vs. Full HD (1920x1080p). With some consumer models now selling for as little as $8 per diagonal inch, the move from Full HD to 4K / Ultra HD will pick up lots of steam.

And with 8K displays now becoming a ‘premium’ product, 4K / Ultra HD will be the ‘everyday’ or mainstream display offering in screen sizes as small as 40 inches and as large as – well, you name it. We’ve already seen 84-inch, 88-inch, and 98-inch commercial displays, and prototypes as large as 120 inches – yes, 10’ of diagonal screen, wrap your head around that – have been exhibited at CES and other shows.

We saw quite a few demonstrations of 4K commercial displays at ISE and expect to see a whole lot more at InfoComm in June, along with the inevitable price wars. And there will be the usual “my encoder handles 4K better than yours with less latency” battles, shoot-outs, and arguments. But that could ultimately turn out to be the appetizer in this full-course meal.

For companies manufacturing signal distribution and switching equipment, 4K / Ultra HD already presents us with a full plate. 8K would be too much to bite off at present! Consider that an 8K/60 video signal using 12-bit RGB color requires a data rate approaching 100 gigabits per second (Gb/s), as compared to a 12-bit, 60 Hz Full HD signal’s rate of about 6 Gb/s, and you can see we will have some pretty steep hills to climb to manage 8K.
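If you want to check that arithmetic yourself, here’s a quick back-of-the-envelope sketch in Python. It counts active pixels only; blanking intervals and link-level line coding (8b/10b and the like) push the actual on-wire rates higher, toward the round numbers above.

```python
# Rough payload rates for uncompressed video (active picture area only).
# Blanking intervals and line coding add further overhead on the wire.
def active_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"Full HD 1080p/60, 12-bit RGB: {active_rate_gbps(1920, 1080, 60, 12):.1f} Gb/s")  # ~4.5 Gb/s
print(f"8K/60, 12-bit RGB: {active_rate_gbps(7680, 4320, 60, 12):.1f} Gb/s")             # ~71.7 Gb/s
```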

Distributing 8K over a network will be equally challenging and will require switching speeds somewhere north of 40 Gb/s even for a basic form of 8K video, which (we assume) will also incorporate high dynamic range and wide color gamuts. 40 Gb/s switches do exist but are pricey and would require 8K signals to be compressed by at least 25% to be manageable. And they’d certainly use optical fiber for all their connections.

To be sure, 4K / Ultra HD isn’t on the endangered species list just yet. (For that matter, you can still buy Full HD monitors and TVs, if that’s any comfort.) But whether it makes sense or not – or whether we’re ready or not – it’s “full speed ahead” for 8K displays as we head into the third decade of the 21st century…

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values becomes a real challenge for displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
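Since each f-stop doubles the luminance, the captured contrast range grows as a power of two. A two-line sketch puts numbers on the difference:

```python
# Each f-stop doubles the luminance, so dynamic range grows as 2**stops.
for stops in (11, 22):
    print(f"{stops} f-stops = {2**stops:,}:1 contrast range")
# 11 f-stops = 2,048:1
# 22 f-stops = 4,194,304:1
```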

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at nighttime) can hit peaks much, much higher – well into the thousands of cd/m2.

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color pixel, resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
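The color math is easy enough to verify:

```python
# Colors representable with three channels (R, G, B) at a given bit depth per channel.
for bits in (8, 10):
    print(f"{bits}-bit color: {2**(3 * bits):,} shades")
# 8-bit color: 16,777,216 shades
# 10-bit color: 1,073,741,824 shades
```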

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough – if the 4K video signal uses lower color resolution (4:2:0, 4:2:2), then it can transport HDR signals as fast as 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises its maximum speed to 32.4 Gb/s, which makes imaging 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI uses extensions for HDR and WCG, with ‘a’ used for static HDR metadata and ‘b’ used for dynamic metadata.)
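Here’s a rough sketch of the bandwidth math behind those limits. It assumes standard CTA-861 4K/60 timing (4400 x 2250 total pixels, including blanking) and the nominal payload rates left after 8b/10b line coding – roughly 14.4 Gb/s for HDMI 2.0 and 25.9 Gb/s for DisplayPort 1.3 – so treat the results as ballpark figures, not a substitute for the interface specs:

```python
# Required bit rate for 4K/60 RGB (4:4:4) at various bit depths vs. nominal link payloads.
PIXELS_PER_SECOND = 4400 * 2250 * 60                       # CTA-861 4K/60 timing, 594 MHz pixel clock
LINKS = {"HDMI 2.0": 14.4e9, "DisplayPort 1.3": 25.92e9}   # payload after 8b/10b coding

for bits in (8, 10, 12):
    required = PIXELS_PER_SECOND * bits * 3                # three color channels (RGB 4:4:4)
    verdicts = ", ".join(f"{name}: {'fits' if required <= cap else 'too fast'}"
                         for name, cap in LINKS.items())
    print(f"4K/60 RGB {bits}-bit needs {required / 1e9:.2f} Gb/s -> {verdicts}")
```

Run that and you’ll see why 4K/60 at 8-bit RGB just squeaks through HDMI 2.0, while 10-bit and 12-bit RGB need DisplayPort 1.3 (or chroma subsampling).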

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over the HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging – medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.

InfoComm 2016 In The Rearview Mirror

Another InfoComm show has come and gone. This is my 23rd InfoComm, and it’s hard to believe – when I first set foot in Anaheim way back in 1994, ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.

For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.

First off: I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.

And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)

AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth.  Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.

And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.

Kramer’s huge booth at InfoComm, touting a shift away from “big boxes” to software and the cloud, was one of the exceptions to the trend to go smaller.

 

LG is doing some very cool things with curved displays, thanks to advancements in OLED and LCD manufacturing.

Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.

Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.

Surprisingly, Toshiba showed there is indeed a second act, turning up with a nice-sized booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having a big footprint. But they’re giving it a shot.

Toshiba has re-entered the super-competitive world of display walls…a market they once dominated 20 years ago.

 

The “surfer dude engineers” from Santa Barbara have a very nice 4K-over-IP encoder/decoder line-up!

Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.

If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.

Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.

Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.

Epson finally got religion and showed its first laser/phosphor 3LCD projector this year – a 25,000-lumen model.

 

And Panasonic harnessed laser/phosphor technology to a new high-brightness 4K projector.

How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.

Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every single exhibitor was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.

I also saw LED walls with pitches as small as 0.9mm. That’s smaller than the pixel pitch of a 50-inch 1366×768 plasma monitor from 1995! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.

And Sony’s CLEDIS 8Kx2K LED wall shows just how much farther we’ve come with this technology, creating what appeared to be a perfectly seamless, pixel-free panoramic LED wall that dazzled with bright, super-saturated color images.

Sony’s CLEDIS 8K x 2K LED wall did an excellent job of hiding its seams – and pixels.

 

Planar (Leyard) is building some amazingly big and bright display walls. And they’ve got 8K resolution, too, thanks to using 16 2K panels.

The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).

In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.

My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously-vigorous defenders of HDMI-based architectures.

The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.

You know there’s considerable interest in AV-over-IP when these guys show up.

 

RGB Spectrum’s new Zio AV-over-IP system has one of the most user-friendly interfaces I’ve seen to date – touch and swipe to connect video streams.

What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.

Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs – software that just about anyone could learn to use in a few hours.

Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Alliance) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on display manufacturers?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also increase to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)
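For anyone who wants to test that claim on their own content, here’s a minimal sketch that drives ffmpeg’s libx265 encoder from Python. The file names are placeholders and the settings are illustrative, not a tuned recipe:

```python
# Encode a 1080p/60 clip with HEVC (libx265) at roughly 2 Mb/s using low-delay settings.
# Requires an ffmpeg build with libx265; input/output names are placeholders.
import subprocess

cmd = [
    "ffmpeg", "-i", "source_1080p60.mp4",
    "-c:v", "libx265",
    "-b:v", "2M",               # target ~2 Mb/s average video bit rate
    "-preset", "veryfast",
    "-tune", "zerolatency",     # drops lookahead and B-frames to minimize encode delay
    "-c:a", "aac", "-b:a", "128k",
    "hevc_2mbps_test.mp4",
]
subprocess.run(cmd, check=True)
```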

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV, through an Ethernet connection? And you don’t even have to plug in a cable to make it work, unless you use a wired Ethernet hookup. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s some distance away over my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, copper bus or optical bus, software-based switcher. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?